
    OCRTOC: A Cloud-Based Competition and Benchmark for Robotic Grasping and Manipulation

    In this paper, we propose a cloud-based benchmark for robotic grasping and manipulation, called the OCRTOC benchmark. The benchmark focuses on the object rearrangement problem, specifically table organization tasks. We provide a set of identical real robot setups and facilitate remote experiments on standardized table organization scenarios of varying difficulty. In this workflow, users upload their solutions to our remote server, and their code is executed on the real robot setups and scored automatically. After each execution, the OCRTOC team resets the experimental setup manually. We also provide a simulation environment that researchers can use to develop and test their solutions. With the OCRTOC benchmark, we aim to lower the barrier to conducting reproducible research on robotic grasping and manipulation and to accelerate progress in this field. Executing standardized scenarios on identical real robot setups allows us to quantify algorithm performance and achieve fair comparisons. Using this benchmark, we held a competition at the 2020 International Conference on Intelligent Robots and Systems (IROS 2020). In total, 59 teams worldwide took part in this competition. We present the results and our observations from the 2020 competition, and discuss our adjustments and improvements for the upcoming OCRTOC 2021 competition.
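
    The abstract describes automatic scoring of how well a solution rearranges objects toward a target table configuration. As a rough illustration of that idea only, the sketch below scores a run by the fraction of objects whose final position lands within a distance threshold of its target position. All names, the threshold value, and the metric itself are hypothetical and not the official OCRTOC scoring function.

    ```python
    import numpy as np

    def table_organization_score(final_poses, target_poses, dist_threshold=0.03):
        """Toy rearrangement score: fraction of objects whose final position lies
        within dist_threshold metres of its target position.
        (Illustrative assumption only; not the official OCRTOC metric.)"""
        placed = 0
        for obj_id, target_xyz in target_poses.items():
            final_xyz = final_poses.get(obj_id)
            if final_xyz is None:
                continue  # object was never placed by the solution
            if np.linalg.norm(np.asarray(final_xyz) - np.asarray(target_xyz)) <= dist_threshold:
                placed += 1
        return placed / len(target_poses)

    # Example: two of three objects end up close enough to their targets.
    target = {"mug": (0.40, 0.10, 0.02), "box": (0.35, -0.05, 0.03), "can": (0.50, 0.00, 0.04)}
    final  = {"mug": (0.41, 0.11, 0.02), "box": (0.20, -0.30, 0.03), "can": (0.50, 0.01, 0.04)}
    print(table_organization_score(final, target))  # -> 0.666...
    ```

    In the actual benchmark, a metric of this kind would be computed automatically on the remote server after each execution on the real robot setup, which is what enables the fair, quantified comparisons the paper emphasizes.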